Reviews: Positive-Unlabeled Compression on the Cloud

Neural Information Processing Systems

The paper targets network compression via a cloud platform. Instead of uploading the entire training set to the platform, it proposes uploading only a small portion of the data as positive (P) examples and using the massive datasets already on the platform as unlabeled (U) data. A PU classifier is trained and then used to select additional positive examples from the unlabeled data; the selected examples, together with the original data, are fed into a knowledge distillation framework to compress the original network. Experiments on three widely used datasets show that the compressed network's performance is close to that of the original deep neural network trained on all data.
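The distillation step described above trains the small (student) network to match the large (teacher) network's soft predictions on the combined P and selected-from-U data. Below is a minimal numpy sketch of the standard temperature-scaled distillation loss (Hinton-style KL divergence); the temperature T=4 and the epsilon constant are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax over the last axis."""
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened distributions,
    scaled by T^2 as in standard knowledge distillation."""
    p = softmax(teacher_logits, T)   # soft targets from the teacher
    q = softmax(student_logits, T)   # student predictions
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

In a full pipeline this term would be combined with an ordinary cross-entropy loss on the labeled positive examples; the selected unlabeled examples contribute only through the teacher's soft targets.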


Positive-Unlabeled Compression on the Cloud

Xu, Yixing, Wang, Yunhe, Chen, Hanting, Han, Kai, XU, Chunjing, Tao, Dacheng, Xu, Chang

Neural Information Processing Systems

Many attempts have been made to extend the great success of convolutional neural networks (CNNs) on high-end GPU servers to portable devices such as smartphones. Providing a compression and acceleration service for deep learning models on the cloud is therefore significant and attractive for end users. However, compressing a network usually requires the entire training dataset (e.g. ImageNet), which could be more cumbersome than the network itself and cannot be easily uploaded to the cloud. In this paper, we present a novel positive-unlabeled (PU) setting for addressing this problem. In practice, only a small portion of the original training set is required as positive examples, and more useful training examples can be obtained from the massive unlabeled data on the cloud through a PU classifier with an attention-based multi-scale feature extractor.
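Training a classifier from only positive and unlabeled data typically relies on a PU risk estimator. The abstract does not spell out the paper's loss, so as a sketch, here is the standard non-negative PU risk of Kiryo et al. (2017), which PU classifiers of this kind commonly build on; the sigmoid surrogate loss and the class prior `pi` are assumptions for illustration.

```python
import numpy as np

def sigmoid_loss(margin):
    """Sigmoid surrogate loss l(z) = 1 / (1 + e^z); small for large positive margins."""
    return 1.0 / (1.0 + np.exp(margin))

def nn_pu_risk(scores_p, scores_u, pi):
    """Non-negative PU risk (Kiryo et al., 2017):
    R = pi * E_P[l(g(x))] + max(0, E_U[l(-g(x))] - pi * E_P[l(-g(x))]),
    where pi is the (assumed known) class prior of positives in the unlabeled data."""
    r_p_pos = sigmoid_loss(scores_p).mean()    # positives labeled positive
    r_p_neg = sigmoid_loss(-scores_p).mean()   # positives labeled negative
    r_u_neg = sigmoid_loss(-scores_u).mean()   # unlabeled treated as negative
    return pi * r_p_pos + max(0.0, r_u_neg - pi * r_p_neg)
```

The `max(0, ...)` clamp keeps the estimated negative-class risk from going below zero, which is what prevents the classifier from overfitting when the unlabeled set is large, as in the cloud setting described here.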